Resilient Binary Neural Network
Authors
Abstract
Binary neural networks (BNNs) have received ever-increasing popularity for their great capability of reducing storage burden as well as quickening inference time. However, there is a severe performance drop compared with real-valued networks, due to the intrinsic frequent weight oscillation during training. In this paper, we introduce a Resilient Binary Neural Network (ReBNN) to mitigate the frequent oscillation for better BNN training. We identify that the weight oscillation mainly stems from the non-parametric scaling factor. To address this issue, we propose to parameterize the scaling factor and introduce a weighted reconstruction loss to build an adaptive training objective. For the first time, we show that the weight oscillation is controlled by the balanced parameter attached to the reconstruction loss, which provides a theoretical foundation for parameterizing it in back propagation. Based on this, we learn our ReBNN by calculating the balanced parameter based on its maximum magnitude, which can effectively mitigate the weight oscillation and lead to a resilient training process. Extensive experiments are conducted upon various network models, such as ResNet and Faster-RCNN for computer vision, as well as BERT for natural language processing. The results demonstrate the overwhelming performance of our ReBNN over prior arts. For example, it achieves 66.9% Top-1 accuracy with a ResNet-18 backbone on the ImageNet dataset, surpassing existing state-of-the-art methods by a significant margin. Our code is open-sourced at https://github.com/SteveTsui/ReBNN.
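The abstract describes two ingredients: a scaling factor learned as a parameter (rather than estimated non-parametrically) and a weighted reconstruction loss whose balancing coefficient is refreshed from a maximum-magnitude statistic. The sketch below illustrates that combination in PyTorch; the class name `BinaryConv2d`, the gradient-based refresh rule in `update_beta`, and all sizes are assumptions for illustration, not the released ReBNN code.

```python
# Minimal, illustrative sketch of a binary convolution with a parameterized
# scaling factor and a weighted reconstruction loss. Names and the exact
# refresh rule for the balancing coefficient are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BinaryConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, k=3, stride=1, padding=1):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.01)
        # Parameterized scaling factor, learned in back propagation,
        # instead of the usual non-parametric mean-of-|w| estimate.
        self.alpha = nn.Parameter(torch.ones(out_ch, 1, 1, 1))
        # Balancing coefficient of the reconstruction loss; it is recomputed
        # from training statistics rather than learned by SGD.
        self.register_buffer("beta", torch.ones(out_ch, 1, 1, 1))
        self.stride, self.padding = stride, padding

    def forward(self, x):
        # Straight-through estimator: sign in the forward pass,
        # (scaled) identity gradient in the backward pass.
        w_bin = self.alpha * (torch.sign(self.weight) - self.weight).detach() \
                + self.alpha * self.weight
        return F.conv2d(x, w_bin, stride=self.stride, padding=self.padding)

    def reconstruction_loss(self):
        # Weighted reconstruction term 0.5 * beta * ||w - alpha * sign(w)||^2.
        err = self.weight - self.alpha * torch.sign(self.weight)
        return 0.5 * (self.beta * err.pow(2)).sum()

    @torch.no_grad()
    def update_beta(self):
        # Hypothetical refresh rule: derive beta from the maximum gradient
        # magnitude per output channel, so channels whose latent weights
        # receive large updates are damped more strongly.
        if self.weight.grad is not None:
            g_max = self.weight.grad.abs().amax(dim=(1, 2, 3), keepdim=True)
            self.beta.copy_(g_max.clamp(min=1e-8))


# Usage: add the reconstruction term to the task loss, then refresh beta.
layer = BinaryConv2d(16, 32)
x = torch.randn(4, 16, 8, 8)
loss = layer(x).pow(2).mean() + layer.reconstruction_loss()
loss.backward()
layer.update_beta()
```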
Similar resources
ResBinNet: Residual Binary Neural Network
Recent efforts on training light-weight binary neural networks offer promising execution/memory efficiency. This paper introduces ResBinNet, which is a composition of two interlinked methodologies aiming to address the slow convergence speed and limited accuracy of binary convolutional neural networks. The first method, called residual binarization, learns a multi-level binary representation fo...
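The residual binarization idea mentioned in this snippet can be pictured as follows: each level binarizes the residual left over by the previous levels, so a sum of a few scaled sign tensors approximates the full-precision weights. The function below is an assumed illustration of that idea, not ResBinNet's actual implementation.

```python
# Illustrative sketch of multi-level ("residual") binarization.
import torch


def residual_binarize(w: torch.Tensor, levels: int = 2):
    """Return per-level scaling factors and sign tensors approximating w."""
    residual = w.clone()
    alphas, signs = [], []
    for _ in range(levels):
        s = torch.sign(residual)
        # The per-tensor scale minimizing ||residual - a*s||^2 is mean(|residual|).
        a = residual.abs().mean()
        alphas.append(a)
        signs.append(s)
        residual = residual - a * s
    return alphas, signs


w = torch.randn(64, 64, 3, 3)
alphas, signs = residual_binarize(w, levels=2)
w_hat = sum(a * s for a, s in zip(alphas, signs))
print((w - w_hat).abs().mean())  # reconstruction error shrinks as levels grow
```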
BICONN: A Binary Competitive Neural Network
In this paper, a competitive neural network with binary synaptic weights is proposed. The aim of this network is to cluster or categorize binary input data. The neural network uses a learning mechanism based on activity levels that generates new binary synaptic weights that become medianoids of the clusters or categories that are being formed by the processing units of the network, since the medianoid...
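One way to picture the medianoid idea for binary data is sketched below, under assumptions: the winner is chosen by Hamming distance and each unit's binary weights are updated to the elementwise majority bit of its assigned inputs. This is an illustration of the concept, not the BICONN learning rule itself.

```python
# Rough sketch of competitive clustering of binary data with binary weights.
import numpy as np


def binary_competitive_clustering(X, n_units=4, iters=10, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize binary weight vectors by sampling from the data.
    W = X[rng.choice(len(X), size=n_units, replace=False)].copy()
    for _ in range(iters):
        # Competition: each input activates the unit with the smallest
        # Hamming distance to its weight vector.
        dists = (X[:, None, :] != W[None, :, :]).sum(axis=2)
        assign = dists.argmin(axis=1)
        # Update: each unit's weights become the medianoid (elementwise
        # majority bit) of its assigned inputs.
        for k in range(n_units):
            members = X[assign == k]
            if len(members):
                W[k] = (members.mean(axis=0) >= 0.5).astype(X.dtype)
    return W, assign


X = (np.random.default_rng(1).random((200, 16)) > 0.5).astype(np.int8)
W, assign = binary_competitive_clustering(X)
```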
Towards Accurate Binary Convolutional Neural Network
We introduce a novel scheme to train binary convolutional neural networks (CNNs) – CNNs with weights and activations constrained to {-1,+1} at run-time. It has been known that using binary weights and activations drastically reduces memory size and accesses, and can replace arithmetic operations with more efficient bitwise operations, leading to much faster test-time inference and lower power co...
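The bitwise speed-up mentioned here rests on a simple identity: for two vectors in {-1,+1}, the dot product equals n minus twice the number of disagreeing positions, which can be counted with XOR and popcount on bit-packed operands. A small illustrative check (the packing convention and helper name are assumptions):

```python
# Dot product of {-1,+1} vectors via bitwise operations.
import numpy as np


def binary_dot(a_bits: int, b_bits: int, n: int) -> int:
    """Dot product of two {-1,+1} vectors of length n, packed as integers
    (bit 1 encodes +1, bit 0 encodes -1)."""
    # XOR marks disagreements; dot = (#agreements) - (#disagreements) = n - 2*d.
    disagreements = bin((a_bits ^ b_bits) & ((1 << n) - 1)).count("1")
    return n - 2 * disagreements


# Check against the ordinary dot product on a random example.
rng = np.random.default_rng(0)
a = rng.choice([-1, 1], size=16)
b = rng.choice([-1, 1], size=16)
a_bits = int("".join("1" if v == 1 else "0" for v in a), 2)
b_bits = int("".join("1" if v == 1 else "0" for v in b), 2)
assert binary_dot(a_bits, b_bits, 16) == int(a @ b)
```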
Estimation of Binary Infinite Dilute Diffusion Coefficient Using Artificial Neural Network
In this study, the use of a three-layer feed-forward neural network has been investigated for estimating the infinite-dilution diffusion coefficient (D12) of supercritical fluid (SCF), liquid, and gas binary systems. The infinite-dilution diffusion coefficient was expressed as a function of critical temperature, critical pressure, critical volume, normal boiling point, molecular volume in normal boilin...
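A minimal sketch of the kind of three-layer feed-forward regressor described (input layer, one hidden layer, one output) is shown below; the chosen features, layer sizes, and placeholder data are assumptions, not the study's actual setup.

```python
# Placeholder three-layer feed-forward regressor for a D12-style target.
import torch
import torch.nn as nn

# Assumed inputs: Tc, Pc, Vc, normal boiling point, molecular volume,
# plus temperature and pressure of the mixture (illustrative choice).
model = nn.Sequential(
    nn.Linear(7, 16),   # input layer -> hidden layer
    nn.Tanh(),
    nn.Linear(16, 1),   # hidden layer -> single D12 output
)

X = torch.randn(32, 7)          # placeholder feature batch
y = torch.rand(32, 1) * 1e-8    # placeholder D12 targets (m^2/s scale)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
```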
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2023
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v37i9.26261